Posterior Consistency of the Silverman g-prior in Bayesian Model Choice

Authors

  • Zhihua Zhang
  • Michael I. Jordan
  • Dit-Yan Yeung
Abstract

Kernel supervised learning methods can be unified using tools from regularization theory. The duality between regularization and priors allows regularization methods to be interpreted in terms of maximum a posteriori estimation and has motivated Bayesian interpretations of kernel methods. In this paper we pursue a Bayesian interpretation of sparsity in the kernel setting by making use of a mixture of a point-mass distribution and a prior that we refer to as “Silverman’s g-prior.” We provide a theoretical analysis of the posterior consistency of a Bayesian model choice procedure based on this prior. We also establish the asymptotic relationship between this procedure and the Bayesian information criterion.
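For orientation, here is a minimal sketch of the kind of construction the abstract refers to; the indicator vector $\gamma$, the kernel sub-matrix $K_\gamma$, and the exact slab scaling are illustrative assumptions, not the paper's precise definitions. Each kernel coefficient is either pinned at zero by a point mass or drawn from a Gaussian slab whose covariance, in analogy with Zellner's g-prior, uses the kernel (Gram) matrix in place of $X^\top X$:

\[
\beta_j \mid \gamma_j \;\sim\; (1-\gamma_j)\,\delta_0(\beta_j) \;+\; \gamma_j\, p(\beta_j \mid \gamma_j = 1), \qquad \gamma_j \in \{0,1\},
\]
\[
\beta_\gamma \mid \sigma^2, \gamma \;\sim\; \mathcal{N}\!\bigl(0,\; g\,\sigma^2\, K_\gamma^{-1}\bigr).
\]

Model choice then amounts to comparing posterior probabilities $p(\gamma \mid y)$ across sparsity patterns, and the asymptotic link mentioned in the abstract is to the standard Bayesian information criterion,
\[
\mathrm{BIC}(M) \;=\; -2\log \hat{L}_M + |M|\log n,
\]
where $\hat{L}_M$ is the maximized likelihood of model $M$, $|M|$ the number of active coefficients, and $n$ the sample size.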


Similar articles

Bayesian Logistic Regression Model Choice via Laplace-Metropolis Algorithm

Following a Bayesian statistical inference paradigm, we provide an alternative methodology for analyzing a multivariate logistic regression. We use a multivariate normal prior in the Bayesian analysis. We present a unique Bayes estimator associated with a prior which is admissible. The Bayes estimators of the coefficients of the model are obtained via MCMC methods. The proposed procedure...


Improving the Performance of Bayesian Estimation Methods in Estimations of Shift Point and Comparison with MLE Approach

A Bayesian analysis is used to detect a change-point in a sequence of independent random variables from exponential distributions. In this paper, we estimate the change point occurring in a sequence of independent exponential observations. The Bayes estimators are derived for the change point, the rate of the exponential distribution before the shift, and the rate of the exponential distribution after the s...


Benchmark Priors for Bayesian Model Averaging

In contrast to a posterior analysis given a particular sampling model, posterior model probabilities in the context of model uncertainty are typically rather sensitive to the specification of the prior. In particular, “diffuse” priors on model-specific parameters can lead to quite unexpected consequences. Here we focus on the practically relevant situation where we need to entertain a (large) n...


Comparison between Frequentist Test and Bayesian Test to Variance Normal in the Presence of Nuisance Parameter: One-sided and Two-sided Hypothesis

This article is concerned with the comparison of the P-value and the Bayesian measure for the variance of a normal distribution with the mean as a nuisance parameter. First, the P-value of the null hypothesis is compared with the posterior probability when a fixed prior distribution is used and the sample size increases. In the second stage, the P-value is compared with the lower bound of the posterior probability when the ...


Dirichlet Prior Sieves in Finite Normal Mixtures

The use of a finite dimensional Dirichlet prior in the finite normal mixture model has the effect of acting like a Bayesian method of sieves. Posterior consistency is directly related to the dimension of the sieve and the choice of the Dirichlet parameters in the prior. We find that naive use of the popular uniform Dirichlet prior leads to an inconsistent posterior. However, a simple adjustment...



Publication year: 2008